# Small-scale efficient training
## Orca Mini 3B
orca_mini_3b is a text-generation model fine-tuned from OpenLLaMA-3B on instructions and inputs from the WizardLM, Alpaca, and Dolly-V2 datasets, applying the explanation-tuning and dataset-construction methods from the Orca research paper.
Tags: Large Language Model · Transformers · English
Maintainer: pankajmathur
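The Orca method pairs each instruction with a system message that elicits step-by-step explanations from a stronger teacher model. A minimal sketch of building such a prompt is below; the field labels (`### System:`, `### User:`, `### Input:`, `### Response:`) are illustrative assumptions, not orca_mini_3b's documented template.

```python
def build_orca_prompt(system: str, instruction: str, input_text: str = "") -> str:
    """Assemble an Orca-style prompt: a system message requesting
    step-by-step explanations, the user instruction, and an optional
    input block. Field labels here are assumptions for illustration."""
    parts = [f"### System:\n{system}", f"### User:\n{instruction}"]
    if input_text:
        parts.append(f"### Input:\n{input_text}")
    parts.append("### Response:\n")
    return "\n\n".join(parts)

prompt = build_orca_prompt(
    "You are a teacher. Explain your reasoning step by step.",
    "Summarize the text below in one sentence.",
    "Smaller models can learn from explanation traces of larger models.",
)
print(prompt)
```

The resulting string would be passed to the model's tokenizer and generation loop; the explanation-seeking system message is the part that distinguishes Orca-style data from plain instruction tuning.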

## Kaz-RoBERTa Conversational
License: Apache-2.0
Kaz-RoBERTa is a transformer model pre-trained in a self-supervised fashion on a large-scale Kazakh corpus, designed primarily for masked language modeling.
Tags: Large Language Model · Transformers · Other
Maintainer: kz-transformers
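Masked language modeling trains the model to recover tokens hidden from its input. A minimal sketch of BERT/RoBERTa-style masking follows; the 80/10/10 replacement split is the original BERT recipe, and Kaz-RoBERTa's exact preprocessing is an assumption here, not documented in this listing.

```python
import random

def mask_tokens(tokens, mask_token="<mask>", vocab=None, p=0.15, seed=1):
    """BERT/RoBERTa-style masking sketch: select ~p of positions; of those,
    replace 80% with the mask token, 10% with a random vocabulary token,
    and leave 10% unchanged. Labels mark which originals must be predicted."""
    rng = random.Random(seed)
    vocab = vocab or tokens
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < p:
            labels.append(tok)          # model is scored on this position
            r = rng.random()
            if r < 0.8:
                masked.append(mask_token)
            elif r < 0.9:
                masked.append(rng.choice(vocab))
            else:
                masked.append(tok)
        else:
            labels.append(None)         # position not scored
            masked.append(tok)
    return masked, labels

tokens = "қазақ тілі үшін модель оқыту мысалы".split()
masked, labels = mask_tokens(tokens)
print(masked)
print(labels)
```

During pre-training the model sees `masked` as input and is penalized only at positions where `labels` holds an original token, which is what lets a RoBERTa-style model learn from unlabeled Kazakh text.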